Results 1 - 11 of 11
1.
Cereb Cortex ; 33(10): 6228-6240, 2023 05 09.
Article in English | MEDLINE | ID: mdl-36724048

ABSTRACT

The intention to name an object modulates neural responses during object recognition tasks. However, the nature of this modulation is still unclear. We established whether a core operation in language, i.e., lexical access, can be observed even when the task does not require language (a size-judgment task), and whether response selection in verbal versus non-verbal semantic tasks relies on similar neuronal processes. We measured and compared neuronal oscillatory activity and behavioral responses to the same set of pictures of meaningful objects, while manipulating the type of task participants had to perform (picture-naming versus size-judgment) and the type of stimuli used to measure lexical access (cognate versus non-cognate). Although activation of words was facilitated when the task required explicit word retrieval (picture-naming task), lexical access occurred even without the intention to name the object (non-verbal size-judgment task). Activation of words and response selection were accompanied by beta (25-35 Hz) desynchronization and theta (3-7 Hz) synchronization, respectively. These effects were observed in both picture-naming and size-judgment tasks, suggesting that words became activated via similar mechanisms, irrespective of whether the task involves language explicitly. This finding has important implications for understanding the link between core linguistic operations and performance in verbal and non-verbal semantic tasks.


Subjects
Language, Visual Perception, Humans, Semantics, Linguistics, Judgment/physiology
2.
J Neurosci ; 42(11): 2313-2326, 2022 03 16.
Article in English | MEDLINE | ID: mdl-35086905

ABSTRACT

During multisensory speech perception, slow δ oscillations (∼1-3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary sensory cues that temporally structure speech. Further, δ oscillations in the left motor cortex seem to align with speech and musical beats, suggesting a possible role in the temporal structuring of (quasi-)rhythmic stimulation. We extended the role of δ oscillations to audiovisual asynchrony detection as a test case of the temporal analysis of multisensory prosody fluctuations in speech. We recorded electroencephalographic (EEG) responses in an audiovisual asynchrony detection task while participants watched videos of a speaker. We filtered the speech signal to remove verbal content and examined how visual and auditory prosodic features temporally (mis-)align. Results show that (1) participants accurately detected audiovisual asynchrony; (2) δ power in the left motor cortex increased in response to audiovisual asynchrony, and the difference in δ power between asynchronous and synchronous conditions predicted behavioral performance; and (3) δ-β coupling in the left motor cortex decreased when listeners could not accurately map visual and auditory prosodies. Finally, both behavioral and neurophysiological effects were altered when the speaker's face was degraded by a visual mask. Together, these findings suggest that motor δ oscillations support asynchrony detection of multisensory prosodic fluctuations in speech.

SIGNIFICANCE STATEMENT: Speech perception is facilitated by regular prosodic fluctuations that temporally structure the auditory signal. Auditory speech processing involves the left motor cortex and associated δ oscillations. However, visual prosody (i.e., a speaker's body movements) complements auditory prosody, and it is unclear how the brain temporally analyses different prosodic features in multisensory speech perception. We combined an audiovisual asynchrony detection task with electroencephalographic (EEG) recordings to investigate how δ oscillations support the temporal analysis of multisensory speech. Results confirmed that asynchrony detection between visual and auditory prosodies leads to increased δ power in the left motor cortex and that this increase correlates with performance. We conclude that δ oscillations are recruited to resolve temporal asynchrony in multisensory speech perception.


Subjects
Speech Perception, Acoustic Stimulation, Auditory Perception/physiology, Electroencephalography, Humans, Photic Stimulation, Speech, Speech Perception/physiology, Visual Perception/physiology
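The two spectral measures in the abstract above (δ-band power and δ-β coupling in the motor cortex) can be sketched in a few lines. This is an illustrative reconstruction on synthetic data, not the authors' pipeline; the band limits, the sampling rate, and the mean-vector-length coupling estimate are common analysis choices assumed here.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 100                                  # assumed sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic "motor cortex" trace: beta bursts whose amplitude follows the
# delta phase, so delta-beta coupling is present by construction
delta = np.sin(2 * np.pi * 2 * t)
beta = (1 + delta) * np.sin(2 * np.pi * 25 * t)
x = delta + 0.5 * beta + 0.2 * rng.normal(size=t.size)

def bandpass(sig, lo, hi, order=2):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, sig)

delta_analytic = hilbert(bandpass(x, 1, 3))
beta_amp = np.abs(hilbert(bandpass(x, 20, 30)))

delta_power = np.mean(np.abs(delta_analytic) ** 2)  # delta-band power
# Mean-vector-length estimate of delta-beta phase-amplitude coupling
coupling = np.abs(np.mean(beta_amp * np.exp(1j * np.angle(delta_analytic))))
```

A condition contrast like the one reported would then compare `delta_power` (and `coupling`) between asynchronous and synchronous trials.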
3.
Curr Res Neurobiol ; 2: 100014, 2021.
Article in English | MEDLINE | ID: mdl-36246505

ABSTRACT

Audiovisual speech perception relies, among other things, on our expertise in mapping a speaker's lip movements onto speech sounds. This multimodal matching is facilitated by salient syllable features that align lip movements and acoustic envelope signals in the 4-8 Hz theta band. Although non-exclusive, the predominance of theta rhythms in speech processing has been firmly established by studies showing that neural oscillations track the acoustic envelope in the primary auditory cortex. Similarly, theta oscillations in the visual cortex entrain to lip movements, and the auditory cortex is recruited during silent speech perception. These findings suggest that neuronal theta oscillations may play a functional role in organising information flow across visual and auditory sensory areas. We presented silent speech movies while participants performed a pure-tone detection task to test whether entrainment to lip movements directs the auditory system and drives behavioural outcomes. We showed that auditory detection varied depending on the ongoing theta phase conveyed by lip movements in the movies. In a complementary experiment presenting the same movies while recording participants' electroencephalogram (EEG), we found that silent lip movements entrained neural oscillations in the visual and auditory cortices, with the visual phase leading the auditory phase. These results support the idea that the visual cortex, entrained by lip movements, filters the sensitivity of the auditory cortex via theta phase synchronization.
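The core analysis described above, testing whether detection varies with the theta phase conveyed by lip movements, can be sketched as follows. Everything here is a synthetic stand-in (the lip signal, sampling rate, onset times, and hit/miss outcomes are illustrative assumptions, not the study's data); only the phase-binning logic reflects the abstract.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 250                                      # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic lip-aperture time series with a dominant 5 Hz (theta) rhythm
lip = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)

# Band-pass in the 4-8 Hz theta band and take the instantaneous phase
sos = butter(2, [4, 8], btype="band", fs=fs, output="sos")
phase = np.angle(hilbert(sosfiltfilt(sos, lip)))

# Bin hypothetical tone-detection outcomes (0/1 per trial) by the theta
# phase of the lip signal at each assumed tone onset
onsets = np.arange(fs, t.size, fs // 2)       # illustrative onset samples
hits = rng.integers(0, 2, onsets.size)
edges = np.linspace(-np.pi, np.pi, 7)         # six phase bins
bins = np.digitize(phase[onsets], edges) - 1
hit_rate = np.array([hits[bins == k].mean() if np.any(bins == k) else np.nan
                     for k in range(6)])
```

A phase-dependent detection effect would show up as a systematic modulation of `hit_rate` across the six bins.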

4.
J Neurosci Methods ; 343: 108830, 2020 09 01.
Article in English | MEDLINE | ID: mdl-32603812

ABSTRACT

BACKGROUND: Researchers rely on the specified capabilities of their hardware and software even though, in reality, these capabilities are often not achieved. Considering that the number of experiments examining neural oscillations has increased steadily, easy-to-implement tools for testing the capabilities of hardware and software are necessary. NEW METHOD: We present an open-source MATLAB toolbox, the Schultz Cigarette Burn Toolbox (SCiBuT) that allows users to benchmark the capabilities of their visual display devices and align neural and behavioral responses with veridical timing of visual stimuli. Specifically, the toolbox marks the corners of the display with black or white squares to indicate the timing of the onset of static images and the timing of frame changes within videos. Using basic hardware (i.e., a photodiode, an Arduino microcontroller, and an analogue input box), the light changes in the corner of the screen can be captured and synchronized with EEG recordings and/or behavioral responses. RESULTS: We demonstrate that the SCiBuT is sensitive to framerate inconsistencies and provide examples of hardware setups that are suboptimal for measuring fine timing. Finally, we show that inconsistencies in framerate during video presentation can affect EEG oscillations. CONCLUSIONS: The SCiBuT provides tools to benchmark framerates and frame changes and to synchronize frame changes with neural and behavioral signals. This is the first open-source toolbox that can perform these functions. The SCiBuT can be freely downloaded (www.band-lab.com/scibut) and be used during experimental trials to improve the accuracy and precision of timestamps to ensure videos are presented at the intended framerate.


Subjects
Computers, Software
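The benchmarking idea above (recovering frame changes from a photodiode watching the corner square, then checking inter-frame intervals against the nominal refresh period) can be sketched as follows. This is not SCiBuT's MATLAB code; the sampling rate, noise level, and simulated dropped frame are assumptions for illustration.

```python
import numpy as np

fs = 10_000                    # photodiode sampling rate (assumed)
refresh = 1 / 60               # nominal period of a 60 Hz display
n_frames = 120
rng = np.random.default_rng(0)

# Synthetic photodiode trace: the corner square toggles black/white every
# frame; frame 50 is "dropped", producing one double-length interval
durations = np.full(n_frames, refresh)
durations[50] *= 2
edges = np.cumsum(durations)   # frame-change times (s)
t = np.arange(0, edges[-1], 1 / fs)
level = np.searchsorted(edges, t, side="right") % 2
trace = level + 0.05 * rng.normal(size=t.size)

# Recover frame changes by thresholding the trace at mid-level
binary = trace > 0.5
toggles = np.flatnonzero(np.diff(binary.astype(int)) != 0)
intervals = np.diff(toggles) / fs

# Flag intervals deviating from nominal by more than half a period
bad = np.flatnonzero(np.abs(intervals - refresh) > refresh / 2)
```

In a real setup the `trace` would come from the photodiode/Arduino capture rather than being synthesized, and the flagged intervals would be aligned with the EEG record.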
5.
Int J Psychol ; 55(3): 342-346, 2020 Jun.
Article in English | MEDLINE | ID: mdl-31062352

ABSTRACT

The informative value of time and temporal structure often remains neglected in cognitive assessments. However, in addition to information about stimulus identity, we can exploit temporal ordering principles, such as regularity, periodicity, or grouping, to generate predictions about the timing of future events. Such predictions may improve cognitive performance by optimising adaptation to dynamic stimuli. Here, we investigated the influence of temporal structure on verbal working memory by assessing immediate recall performance for aurally presented digit sequences (forward digit span) as a function of standard (1000 ms), short (700 ms), long (1300 ms), and mixed (700-1300 ms) stimulus-onset asynchronies (SOAs) during the presentation phase. Participants' digit spans were lower for short and mixed SOA presentation relative to standard SOAs. This confirms an impact of temporal structure on the classic "magical number seven," suggesting that working memory performance can in part be regulated through the systematic application of temporal ordering principles.


Assuntos
Cognição/fisiologia , Memória de Curto Prazo/fisiologia , Adulto , Feminino , Humanos , Masculino , Adulto Jovem
6.
Front Hum Neurosci ; 12: 434, 2018.
Article in English | MEDLINE | ID: mdl-30405383

ABSTRACT

How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remains unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1-3 Hz and 4-7 Hz). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria based on recent evidence and hypotheses, which support the notion that lower motor beta frequency may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.

7.
Front Psychol ; 8: 96, 2017.
Article in English | MEDLINE | ID: mdl-28203214

ABSTRACT

During natural speech perception, listeners rely on a wide range of cues to support comprehension, from semantic context to prosodic information. There is a general consensus that prosody plays a role in syntactic parsing, but most studies focusing on ambiguous relative clauses (RC) show that prosodic cues alone are insufficient to reverse the preferred interpretation of a sentence. These findings suggest that universally preferred structures (e.g., the Late Closure principle) matter far more than prosodic cues in such cases. This study explores an alternative hypothesis: that the weak effect of prosody might be due to the influence of various syntactic, lexical-semantic, and acoustic confounding factors, and investigates the consequences of prosodic breaks while controlling for these variables. We used Spanish RC sentences in three experimental conditions where the presence and position (following the first or second noun phrase) of prosodic breaks was manipulated. The results showed that the placement of a prosodic break determined sentence interpretation by changing the preferred attachment of the RC. Listeners' natural preference for low attachment (in the absence of a break) was reinforced when a prosodic break was placed after the first noun. In contrast, a prosodic break placed after the second noun reversed the preferred interpretation of the sentence, toward high attachment. We argue that, in addition to other factors, listeners indeed use prosodic breaks as robust cues to syntactic parsing during speech processing, as these cues may direct listeners toward one interpretation or another.

8.
Neuroimage ; 132: 129-137, 2016 05 15.
Article in English | MEDLINE | ID: mdl-26892858

ABSTRACT

During public addresses, speakers accompany their discourse with spontaneous hand gestures (beats) that are tightly synchronized with the prosodic contour of the discourse. It has been proposed that speech and beat gestures originate from a common underlying linguistic process whereby both speech prosody and beats serve to emphasize relevant information. We hypothesized that breaking the consistency between beats and prosody through temporal desynchronization would modulate activity of brain areas sensitive to speech-gesture integration. To this aim, we measured BOLD responses as participants watched a natural discourse in which the speaker used beat gestures. In order to identify brain areas specifically involved in processing hand gestures with communicative intention, beat synchrony was evaluated against arbitrary visual cues bearing equivalent rhythmic and spatial properties to the gestures. Our results revealed that the left MTG and IFG were specifically sensitive to speech synchronized with beats, compared to the arbitrary vision-speech pairing. Our results suggest that listeners assign beats a function of visual prosody, complementary to the prosodic structure of speech. We conclude that the emphasizing function of beat gestures in speech perception is instantiated through a specialized brain network sensitive to the communicative intent conveyed by a speaker with his/her hands.


Assuntos
Lobo Frontal/fisiologia , Gestos , Linguística , Percepção da Fala/fisiologia , Lobo Temporal/fisiologia , Percepção Visual/fisiologia , Adulto , Encéfalo/fisiologia , Sinais (Psicologia) , Feminino , Mãos , Humanos , Imageamento por Ressonância Magnética , Masculino , Adulto Jovem
9.
Front Hum Neurosci ; 9: 527, 2015.
Article in English | MEDLINE | ID: mdl-26441618

ABSTRACT

During social interactions, speakers often produce spontaneous gestures to accompany their speech. These coordinated body movements convey communicative intentions, and modulate how listeners perceive the message in a subtle but important way. In the present perspective, we focus on the role that congruent non-verbal information from beat gestures may play in the neural responses to speech. Whilst delta-theta oscillatory brain responses reflect the time-frequency structure of the speech signal, we argue that beat gestures promote phase resetting at relevant word onsets. This mechanism may facilitate the anticipation of associated acoustic cues relevant for prosodic/syllabic-based segmentation in speech perception. We report recently published data supporting this hypothesis, and discuss the potential of beats (and gestures in general) for further studies investigating continuous audiovisual (AV) speech processing through low-frequency oscillations.

10.
Cortex ; 68: 76-85, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25595613

ABSTRACT

Speakers often accompany speech with spontaneous beat gestures in natural spoken communication. These gestures are usually aligned with lexical stress and can modulate the saliency of their affiliate words. Here we addressed the consequences of beat gestures on the neural correlates of speech perception. Previous studies have highlighted the role played by theta oscillations in temporal prediction of speech. We hypothesized that the sight of beat gestures may influence ongoing low-frequency neural oscillations around the onset of the corresponding words. Electroencephalographic (EEG) recordings were acquired while participants watched a continuous, naturally recorded discourse. The phase-locking value (PLV) at word onset was calculated from the EEG for pairs of identical words that had been pronounced with and without a concurrent beat gesture in the discourse. We observed an increase in PLV in the 5-6 Hz theta range as well as a desynchronization in the 8-10 Hz alpha band around the onset of words preceded by a beat gesture. These findings suggest that beats help tune low-frequency oscillatory activity at relevant moments during natural speech perception, providing new insight into how speech and paralinguistic information are integrated.


Subjects
Gestures, Nonverbal Communication/physiology, Nonverbal Communication/psychology, Speech Perception/physiology, Speech/physiology, Adult, Auditory Perception, Cortical Synchronization, Electroencephalography, Female, Hand, Humans, Male, Psychomotor Performance/physiology, Theta Rhythm, Visual Perception, Young Adult
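The phase-locking value used in the abstract above has a compact definition: at each time point, it is the length of the mean unit phase vector across trials. A hedged sketch on synthetic epochs (the sampling rate, trial counts, and signal are assumptions; the band here is a slightly wider 4-7 Hz theta band than the reported 5-6 Hz, chosen for filter stability):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 250                                  # assumed sampling rate (Hz)
n_trials, n_samples = 30, 500
rng = np.random.default_rng(0)

# Synthetic epochs time-locked to "word onset": a theta component with a
# consistent phase across trials, plus noise
t = np.arange(n_samples) / fs
trials = (np.sin(2 * np.pi * 5.5 * t)
          + 0.5 * rng.normal(size=(n_trials, n_samples)))

# Band-pass around theta and extract the instantaneous phase per trial
sos = butter(2, [4, 7], btype="band", fs=fs, output="sos")
phases = np.angle(hilbert(sosfiltfilt(sos, trials, axis=1), axis=1))

# PLV at each time point: length of the mean unit phase vector over trials
plv = np.abs(np.mean(np.exp(1j * phases), axis=0))
```

PLV lies between 0 (random phases across trials) and 1 (perfect phase alignment); the study's contrast compares this quantity between words with and without a concurrent beat gesture.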
11.
Brain Lang ; 124(2): 143-52, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23333667

ABSTRACT

Spontaneous beat gestures are an integral part of the paralinguistic context during face-to-face conversations. Here we investigated the time course of beat-speech integration in speech perception by measuring ERPs evoked by words pronounced with or without an accompanying beat gesture, while participants watched a spoken discourse. Words accompanied by beats elicited a positive shift in ERPs at an early sensory stage (before 100 ms) and at a later time window coinciding with the auditory component P2. The same word tokens produced no ERP differences when participants listened to the discourse without view of the speaker. We conclude that beat gestures are integrated with speech early on in time and modulate sensory/phonological levels of processing. The present results support the possible role of beats as a highlighter, helping the listener to direct the focus of attention to important information and modulate the parsing of the speech stream.


Subjects
Evoked Potentials/physiology, Gestures, Speech Perception/physiology, Visual Perception/physiology, Adult, Brain Mapping, Electroencephalography, Female, Humans, Male, Speech/physiology, Surveys and Questionnaires
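The basic ERP contrast described above (average epochs time-locked to word onset for words with versus without a beat gesture, then take the difference wave) can be sketched on synthetic data. The Gaussian "P2-like" component, amplitudes, trial counts, and noise level below are illustrative assumptions, not the authors' recordings.

```python
import numpy as np

fs = 500                                  # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.4, 1 / fs)          # epoch: -100 to 400 ms
rng = np.random.default_rng(0)

def make_epochs(p2_amplitude, n_trials=40):
    # Gaussian positivity around 200 ms standing in for the P2 component
    p2 = p2_amplitude * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
    return p2 + rng.normal(scale=2.0, size=(n_trials, t.size))

beat = make_epochs(3.0)                   # words accompanied by a beat
no_beat = make_epochs(1.0)                # same words without a gesture

erp_beat = beat.mean(axis=0)
erp_no_beat = no_beat.mean(axis=0)
difference = erp_beat - erp_no_beat       # positive shift expected near P2
```

The reported positive shift would appear as a sustained positivity in `difference` in the early window and around the P2 latency.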